@chr5tphr chr5tphr commented Oct 13, 2022

  • change the core Hook to support the modification of multiple inputs and params
  • for this, each input and parameter that requires a gradient is now hooked, and a backward function, which is aware of the current 'sink', is called for each
  • use View instead of a custom Identity to produce a .grad_fn
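
The View-based mechanism can be sketched as follows; this is a minimal illustration of the idea in plain PyTorch, not Zennit's actual implementation:

```python
import torch

x = torch.ones(3, requires_grad=True)

# a leaf tensor has no .grad_fn to hook; a view of it does (ViewBackward0),
# which replaces the previous custom Identity autograd function
y = x.view_as(x)
assert x.grad_fn is None and y.grad_fn is not None

# Node.register_hook fires when the gradient w.r.t. this node is computed;
# returning None leaves the gradient unmodified
grads = []
handle = y.grad_fn.register_hook(
    lambda grad_inputs, grad_outputs: grads.append(grad_outputs[0])
)
y.sum().backward()
handle.remove()
```

A modifying hook would instead return a replacement for `grad_inputs`, which is where a rule's backward implementation plugs in.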

Note:

  • this may be a breaking change for custom hooks based on the old implementation

TODO:

  • finish implementation:
    • parameters have no grad_fn, and we cannot simply overwrite them with a view; hooking directly with tensor hooks is problematic when the parameters are used in different functions
    • there may be a better approach than calling the backward function once per 'sink', although the current implementation may allow for better modularity
    • multiple outputs are still not supported; it may be worth thinking about how to support them, though this may also be better done at a later stage
  • implement tests
    • new tests for the new functionality: multiple inputs and params in hooks
    • fix old tests that assume the use of Identity and are not sink-aware
  • add documentation
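
The parameter problem above can be reproduced directly; this sketch only demonstrates the limitation and is not part of the implementation:

```python
import torch

lin = torch.nn.Linear(2, 2)

# parameters are leaf tensors: there is no .grad_fn that could be hooked,
# and overwriting them with a view would break the Module's bookkeeping
assert lin.weight.is_leaf and lin.weight.grad_fn is None

# the fallback, a plain tensor hook, only sees the accumulated gradient,
# which is ambiguous when the parameter is used in multiple functions
seen = []
handle = lin.weight.register_hook(lambda grad: seen.append(grad.shape))
lin(torch.ones(1, 2)).sum().backward()
handle.remove()
```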

@chr5tphr chr5tphr mentioned this pull request Oct 13, 2022
@chr5tphr chr5tphr force-pushed the hook-multi-input-param branch from 5672160 to 8f11583 on November 4, 2022 17:01
chr5tphr added a commit that referenced this pull request Aug 10, 2023
- use additions to forward hooks in torch 2.0.0 to pass keyword arguments
- handle multiple inputs and outputs in core.Hook and core.BasicHook, by
  passing all required grad_outputs and inputs to the backward
  implementation

TODO:

- finish draft and test implementation
- add tests
- add documentation

- This stands in conflict with #168, but promises a better
  implementation by handling inputs and outputs as belonging to a single
  function, rather than individually as proposed in #168
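
The kwargs-passing relies on the `with_kwargs` flag added to `register_forward_hook` in torch 2.0.0; a minimal sketch:

```python
import torch

mod = torch.nn.Linear(2, 2)
captured = {}

# with torch >= 2.0.0, with_kwargs=True makes the forward hook receive the
# keyword arguments of the call in addition to the positional ones
def forward_hook(module, args, kwargs, output):
    captured["args"], captured["kwargs"] = args, kwargs
    return output

handle = mod.register_forward_hook(forward_hook, with_kwargs=True)
out = mod(torch.ones(1, 2))
handle.remove()
```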
chr5tphr added a commit that referenced this pull request Apr 8, 2024
chr5tphr added a commit that referenced this pull request Apr 9, 2024
- use additions to forward hooks in torch 2.0.0 to pass keyword arguments
- handle multiple inputs and outputs in core.Hook and core.BasicHook, by
  passing all required grad_outputs and inputs to the backward
  implementation

TODO:

- attribution scores are currently wrong in BasicHook, likely due to an
  issue with the gradient inside BasicHook; possibly some cross-terms
  interact that should not

- finish draft and test implementation
- add tests
- add documentation

- This stands in conflict with #168, but promises a better
  implementation by handling inputs and outputs as belonging to a single
  function, rather than individually as proposed in #168
chr5tphr added a commit that referenced this pull request Jul 12, 2025
chr5tphr added a commit that referenced this pull request Jul 12, 2025
- torch 2.0.0 allows us to pass multiple args and kwargs to hooks
- handle multiple inputs and outputs in core.Hook and core.BasicHook, by
  passing all required grad_outputs and inputs to the backward
  implementation
- BasicHook still only processes a single input
- Hook checks the function signature to allow for backwards compatibility

TODO:

- add tests
- add documentation

- This stands in conflict with #168, but promises a better
  implementation by handling inputs and outputs as belonging to a single
  function, rather than individually as proposed in #168
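
The signature check for backwards compatibility could be dispatched roughly like this; `call_forward_hook` and both hook functions are hypothetical names for illustration, not Zennit's API:

```python
import inspect

def call_forward_hook(hook, module, args, kwargs, output):
    # hooks with the new four-argument signature also receive kwargs;
    # older three-argument hooks keep working unchanged
    if len(inspect.signature(hook).parameters) >= 4:
        return hook(module, args, kwargs, output)
    return hook(module, args, output)

def old_style(module, inputs, output):
    return "old"

def new_style(module, args, kwargs, output):
    return "new"
```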
chr5tphr added a commit that referenced this pull request Jul 12, 2025
- torch 2.0.0 allows us to pass multiple args and kwargs to hooks
- handle multiple inputs and outputs in core.Hook and core.BasicHook, by
  passing all required grad_outputs and inputs to the backward
  implementation
- BasicHook still only processes a single input
- Hook checks the function signature to allow for backwards compatibility

TODO:

- add tests
- add documentation

- This stands in conflict with #168, but promises a better
  implementation by handling inputs and outputs as belonging to a single
  function, rather than individually as proposed in #168
- This does not deal with parameter gradients, which are better left to a separate PR
- This will implement #176
chr5tphr added a commit that referenced this pull request Jul 12, 2025
- torch 2.0.0 allows us to pass multiple args and kwargs to hooks
- handle multiple inputs and outputs in core.Hook and core.BasicHook, by
  passing all required grad_outputs and inputs to the backward
  implementation
- BasicHook still only processes a single input
- Hook checks the function signature to allow for backwards compatibility
- added a basic test that uses the kwargs signature
- added a note to the documentation that keyword arguments are supported

Notes:

- This stands in conflict with #168, but promises a better
  implementation by handling inputs and outputs as belonging to a single
  function, rather than individually as proposed in #168
- This does not deal with parameter gradients, which are better left to a separate PR

implements #176
chr5tphr added a commit that referenced this pull request Jul 12, 2025
chr5tphr added a commit that referenced this pull request Jul 13, 2025
chr5tphr added a commit that referenced this pull request Jul 13, 2025
@chr5tphr chr5tphr force-pushed the main branch 4 times, most recently from 3fbdb43 to 6204e31 on July 31, 2025 14:44